58 research outputs found

    Quantum Circuits for the Unitary Permutation Problem

    Full text link
    We consider the Unitary Permutation problem which consists, given $n$ unitary gates $U_1, \ldots, U_n$ and a permutation $\sigma$ of $\{1,\ldots,n\}$, in applying the unitary gates in the order specified by $\sigma$, i.e. in performing $U_{\sigma(n)} \ldots U_{\sigma(1)}$. This problem was introduced and investigated by Colnaghi et al., who consider two models of computation. The first is the (standard) model of query complexity: the complexity measure is the number of calls to any of the unitary gates $U_i$ in a quantum circuit which solves the problem. The second model provides quantum switches and treats unitary transformations as inputs of second order; in that case the complexity measure is the number of quantum switches. In their paper, Colnaghi et al. have shown that the problem can be solved within $n^2$ calls in the query model and $\frac{n(n-1)}{2}$ quantum switches in the new model. We refine these results by proving that $n\log_2(n) + \Theta(n)$ quantum switches are necessary and sufficient to solve this problem, whereas $n^2-2n+4$ calls are sufficient to solve this problem in the standard quantum circuit model. We prove, with an additional assumption on the family of gates used in the circuits, that $n^2-o(n^{7/4+\epsilon})$ queries are required, for any $\epsilon > 0$. The upper and lower bounds for the standard quantum circuit model are established by pointing out connections with the permutation as substring problem introduced by Karp. Comment: 8 pages, 5 figures
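
    To make the problem statement concrete, here is a minimal sketch of the naive strategy that simply multiplies the gates in the permuted order. It is our own illustration in Python with NumPy; the helper names (random_unitary, apply_permuted) and the QR-based sampling are assumptions of the sketch, and it shows only the task being solved, not the circuit constructions or bounds discussed above.

    import numpy as np

    def random_unitary(d, rng):
        # Sample a random d x d unitary via QR decomposition (illustrative only).
        z = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
        q, r = np.linalg.qr(z)
        # Normalize column phases; q is unitary either way.
        return q * (np.diag(r) / np.abs(np.diag(r)))

    def apply_permuted(gates, sigma, state):
        # Apply U_{sigma(n)} ... U_{sigma(1)} to `state`.
        # `sigma` is a permutation of 0..n-1 (0-indexed here for convenience);
        # sigma(1) acts first on the state, sigma(n) acts last.
        for i in sigma:
            state = gates[i] @ state
        return state

    rng = np.random.default_rng(0)
    n, d = 4, 2
    gates = [random_unitary(d, rng) for _ in range(n)]
    sigma = [2, 0, 3, 1]                      # an example permutation
    psi = np.array([1.0, 0.0], dtype=complex)
    print(apply_permuted(gates, sigma, psi))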

    Translation, validity and reliability of the British Sign Language (BSL) version of the EQ-5D-5L.

    Get PDF
    PURPOSE: To translate the health questionnaire EuroQol EQ-5D-5L into British Sign Language (BSL), to test its reliability with the signing Deaf population of BSL users in the UK and to validate its psychometric properties. METHODS: The EQ-5D-5L BSL was developed following the international standard for translation required by EuroQol, with additional agreed features appropriate to a visual language. Data collection used an online platform to view the signed (BSL) version of the tests. The psychometric testing included content validity, assessed by interviewing a small sample of Deaf people. Reliability was tested by internal consistency of the items and test-retest, and convergent validity was assessed by determining how well EQ-5D-5L BSL correlates with CORE-10 BSL and CORE-6D BSL. RESULTS: The psychometric properties of the EQ-5D-5L BSL are good, indicating that it can be used to measure health status in the Deaf signing population in the UK. Convergent validity between EQ-5D-5L BSL and CORE-10 BSL and CORE-6D BSL is consistent, demonstrating that the BSL version of EQ-5D-5L is a good measure of the health status of an individual. The test-retest reliability of EQ-5D-5L BSL, for each dimension of health, was shown to have Cohen's kappa values of 0.47-0.61; these were in the range of moderate to good and were therefore acceptable. CONCLUSIONS: This is the first time EQ-5D-5L has been translated into a signed language for use with Deaf people; it is a significant step towards conducting studies of health status and cost-effectiveness in this population.
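
    As a concrete illustration of the test-retest analysis described above, here is a minimal sketch of computing Cohen's kappa for one dimension of health across two administrations. The example ratings, the use of scikit-learn, and the unweighted kappa are our own assumptions for illustration and are not taken from the study itself.

    import numpy as np
    from sklearn.metrics import cohen_kappa_score

    # Hypothetical responses (levels 1-5) for one EQ-5D-5L dimension,
    # collected at test and retest from the same respondents.
    test   = np.array([1, 2, 2, 3, 1, 4, 2, 5, 3, 1])
    retest = np.array([1, 2, 3, 3, 1, 4, 2, 4, 3, 2])

    kappa = cohen_kappa_score(test, retest)
    print(f"Cohen's kappa for this dimension: {kappa:.2f}")
    # Kappa values in roughly the 0.41-0.60 band are conventionally read as
    # "moderate" agreement; the study reports 0.47-0.61 across dimensions.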

    Surfactant Protein D modulates allergen particle uptake and inflammatory response in a human epithelial airway model

    Get PDF
    Background: Allergen-containing subpollen particles (SPP) are released from whole plant pollen upon contact with water or even high humidity. Because of their size, SPP can preferentially reach the lower airways, where they come into contact with surfactant protein (SP)-D. The aim of the present study was to investigate the influence of SP-D in a complex three-dimensional human epithelial airway model, which simulates the most important barrier functions of the epithelial airway. The uptake of SPP as well as the secretion of pro-inflammatory cytokines was investigated. Methods: SPP were isolated from timothy grass and subsequently fluorescently labeled. A human epithelial airway model was built using human type II pneumocyte-like cells (A549 cells), human monocyte-derived macrophages and human monocyte-derived dendritic cells. The epithelial cell model was incubated with SPP in the presence and absence of surfactant protein D. Particle uptake was evaluated by confocal microscopy and advanced computer-controlled analysis. Finally, human primary CD4+ T cells were added to the epithelial airway model and soluble mediators were measured by enzyme-linked immunosorbent assay or bead array. Results: SPP were taken up by epithelial cells, macrophages, and dendritic cells. This uptake coincided with secretion of pro-inflammatory cytokines and chemokines. SP-D modulated the uptake of SPP in a cell-type-specific way (e.g. it increased the number of macrophages and epithelial cells participating in allergen particle uptake) and led to a decreased secretion of pro-inflammatory cytokines. Conclusion: These results display a possible mechanism of how SP-D can modulate the inflammatory response to inhaled allergen.

    Virological failure after 1 year of first-line ART is not associated with HIV minority drug resistance in rural Cameroon

    Get PDF
    Objectives: The aim of this study was to describe clinical and virological outcomes in therapy-naive HIV-1-positive patients treated in a routine ART programme in rural Cameroon. Methods: In a prospective cohort, 300 consecutive patients starting first-line ART were enrolled and followed for 12 months. Among 238 patients with available viral load data at Month 12, logistic regression was used to analyse risk factors for virological failure (≥1000 HIV RNA copies/mL), including clinical, immunological and virological parameters, as well as data on drug adherence. Population sequencing was performed to detect the presence of drug-resistance mutations in patients with virological failure at Month 12; minority drug-resistance mutations at baseline were analysed using next-generation sequencing in these patients and matched controls. Results: At Month 12, 38/238 (16%) patients experienced virological failure (≥1000 HIV RNA copies/mL). Patients with virological failure were younger, had lower CD4 cell counts and were more often WHO stage 3 or 4 at baseline. Sixty-three percent of patients with virological failure developed at least one drug-resistance mutation; the M184V (n = 18) and K103N (n = 10) mutations were most common. At baseline, 6/30 patients (20%) experiencing virological failure and 6/35 (17%) matched controls had evidence of minority drug-resistance mutations using next-generation sequencing (P = 0.77). Lower CD4 count at baseline (OR per 100 cells/mm³ lower 1.41, 95% CI 1.02-1.96, P = 0.04) and poorer adherence (OR per 1% lower 1.05, 95% CI 1.02-1.08, P < 0.001) were associated with a higher risk of virological failure. Unavailability of ART at the treatment centre was the single most common cause of incomplete adherence. Conclusions: Virological failure after 1 year of ART was not associated with minority drug resistance at baseline but with incomplete adherence. Strategies to assure adherence and uninterrupted drug supplies are pivotal factors for therapy success.
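
    For readers who want to see the shape of such an analysis, below is a minimal sketch of a logistic-regression model of virological failure using statsmodels on a made-up data frame. The variable names, scaling (CD4 per 100 cells lower, adherence per 1% lower) and all data are hypothetical; this is not the authors' actual model or dataset.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 238  # patients with viral load data at Month 12

    # Hypothetical predictors: baseline CD4 count (cells/mm^3) and adherence (%).
    df = pd.DataFrame({
        "cd4_baseline": rng.normal(250, 120, n).clip(10, 800),
        "adherence_pct": rng.normal(92, 8, n).clip(40, 100),
    })
    # Hypothetical outcome: virological failure (>= 1000 copies/mL) at Month 12.
    logit_p = 0.5 - 0.004 * df["cd4_baseline"] - 0.03 * (df["adherence_pct"] - 90)
    df["failure"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

    # Rescale predictors so coefficients read as "per 100 cells lower" / "per 1% lower".
    X = pd.DataFrame({
        "cd4_per_100_lower": -df["cd4_baseline"] / 100.0,
        "adherence_per_1_lower": -df["adherence_pct"],
    })
    X = sm.add_constant(X)

    fit = sm.Logit(df["failure"], X).fit(disp=False)
    odds_ratios = np.exp(fit.params)           # odds ratios
    conf_int = np.exp(fit.conf_int())          # 95% confidence intervals
    print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))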

    Efficient convexity and domination algorithms for fine- and medium-grain hypercube computers

    Full text link
    This paper gives hypercube algorithms for some simple problems involving geometric properties of sets of points. The properties considered emphasize aspects of convexity and domination. Efficient algorithms are given for both fine- and medium-grain hypercube computers, including a discussion of implementation, running times and results on an Intel iPSC hypercube, as well as theoretical results. For both serial and parallel computers, sorting plays an important role in geometric algorithms for determining simple properties, often being the dominant component of the running time. Since the time required to sort data on a hypercube computer is still not fully understood, the running times of some of our algorithms for unsorted data are not completely determined. For both the fine- and medium-grain models, we show that faster expected-case running time algorithms are possible for point sets generated randomly. Our algorithms are developed for sets of planar points, with several of them extending to sets of points in spaces of higher dimension. Peer Reviewed: http://deepblue.lib.umich.edu/bitstream/2027.42/41352/1/453_2005_Article_BF01758751.pd
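
    The abstract notes that sorting often dominates these geometric computations. As a point of reference, here is a minimal serial sketch of the classic sort-then-scan approach to the planar convex hull (Andrew's monotone chain), where the sort is the asymptotically dominant step. This is our own Python reference, not the paper's hypercube algorithm.

    def convex_hull(points):
        # Return the convex hull of 2-D points in counter-clockwise order.
        # Serial monotone chain: O(n log n) for the sort, O(n) for the scan.
        pts = sorted(set(points))              # sort lexicographically by (x, y)
        if len(pts) <= 2:
            return pts

        def cross(o, a, b):
            # z-component of (a - o) x (b - o); > 0 means a left turn.
            return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

        def half_hull(seq):
            hull = []
            for p in seq:
                while len(hull) >= 2 and cross(hull[-2], hull[-1], p) <= 0:
                    hull.pop()                 # drop points that are not extreme
                hull.append(p)
            return hull

        lower = half_hull(pts)
        upper = half_hull(reversed(pts))
        return lower[:-1] + upper[:-1]         # endpoints are shared, drop duplicates

    print(convex_hull([(0, 0), (2, 1), (1, 1), (2, 2), (0, 2), (1, 3)]))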

    Computing convexity properties of images on a pyramid computer

    Full text link
    We present efficient parallel algorithms for using a pyramid computer to determine convexity properties of digitized black/white pictures and labeled figures. Algorithms are presented for deciding convexity, identifying extreme points of convex hulls, and using extreme points in a variety of fashions. For a pyramid computer with a base of n simple processing elements arranged in an n^{1/2} × n^{1/2} square, the running times of the algorithms range from Θ(log n) to find the extreme points of a convex figure in a digitized picture, to Θ(n^{1/6}) to find the diameter of a labeled figure, Θ(n^{1/4} log n) to find the extreme points of every figure in a digitized picture, and Θ(n^{1/2}) to find the extreme points of every labeled set of processing elements. Our results show that the pyramid computer can be used to obtain efficient solutions to nontrivial problems in image analysis. We also show the sensitivity of efficient pyramid-computer algorithms to the rate at which essential data can be compressed. Finally, we show that a wide variety of techniques are needed to make full and efficient use of the pyramid architecture. Peer Reviewed: http://deepblue.lib.umich.edu/bitstream/2027.42/41351/1/453_2005_Article_BF01759066.pd
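
    As a plain serial reference for what the pyramid algorithms compute, here is how one might list the extreme points of every labeled figure in a small digitized picture. This is our own sketch assuming NumPy and SciPy; it says nothing about the Θ(·) pyramid running times above.

    import numpy as np
    from scipy.spatial import ConvexHull

    # A tiny labeled image: 0 = background, 1 and 2 are two labeled figures.
    image = np.array([
        [0, 1, 1, 0, 0],
        [0, 1, 1, 0, 2],
        [0, 1, 0, 2, 2],
        [0, 0, 0, 2, 2],
    ])

    for label in np.unique(image):
        if label == 0:
            continue
        pts = np.argwhere(image == label).astype(float)   # (row, col) coordinates
        if len(pts) < 3:
            extremes = pts                                 # degenerate figure
            # (fully collinear figures would also need special handling)
        else:
            extremes = pts[ConvexHull(pts).vertices]       # extreme points of the hull
        print(f"figure {label}: extreme points {extremes.tolist()}")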

    Divide-and-conquer algorithms on the hypercube

    No full text

    Reconfigurable Mesh Algorithms For Fundamental Data Manipulation Operations

    No full text
    Reconfigurable mesh (RMESH) algorithms for several fundamental operations are developed. These operations include data broadcast, prefix sum, data sum, ranking, shift, data accumulation, consecutive sum, adjacent sum, sorting, random access read, and random access write.
    Keywords: reconfigurable mesh computer, parallel algorithms, data manipulation.
    1 Introduction. Recently, several similar reconfigurable mesh (RMESH) architectures have been proposed [MILL88abc, LI89ab, BEN90]. It has been demonstrated that these architectures are often very easy to program and that in many cases it is possible to obtain constant-time algorithms that use a polynomial number of processors for problems that cannot be solved this way in the PRAM model [BEN90, MILL88a, JENQ91b, WANG90ab]. For instance, the parity of n bits can be found in O(1) time on a reconfigurable mesh with n^2 processors, while it takes Ω(log n/log log n) time on any CRCW PRAM with a polynomial number of processors [BEAM87...
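
    Since prefix sum is one of the listed primitives, the following is a minimal serial reference for prefix sum and a couple of the other data-manipulation operations. It is our own Python sketch that only defines what each operation computes; it is not an RMESH algorithm and says nothing about bus-based constant-time techniques.

    from itertools import accumulate

    def prefix_sum(values):
        # Inclusive prefix sum: output[i] = values[0] + ... + values[i].
        return list(accumulate(values))

    def data_sum(values):
        # Data sum: a single value, the sum of all inputs.
        return sum(values)

    def shift(values, k, fill=0):
        # Shift every element k positions right (left if k < 0), filling with `fill`.
        n = len(values)
        out = [fill] * n
        for i, v in enumerate(values):
            if 0 <= i + k < n:
                out[i + k] = v
        return out

    data = [3, 1, 4, 1, 5, 9]
    print(prefix_sum(data))   # [3, 4, 8, 9, 14, 23]
    print(data_sum(data))     # 23
    print(shift(data, 2))     # [0, 0, 3, 1, 4, 1]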

    By their fruits shall ye know them

    No full text